Web Survey Bibliography
Recent research has shown that it is possible to improve coverage and reduce nonresponse by mixing web and mail data collection modes. It is generally assumed that because both web and mail are visual modes, they will produce comparable data, but little empirical research has examined this assumption. Now that surveyors can mix web and mail modes with relative ease, we need to know whether measurement is, in fact, comparable across them. Open-ended questions (number boxes and text boxes) seem especially problematic because respondents often have full control over how they answer them; answers to these questions are not structured and guided in the same way as closed-ended questions with limited response options. This is especially true in self-administered surveys, where there is no interviewer to probe, ensure that the desired type of answer is provided, or convert the respondent's answer into the desired format. In this paper, we examine item-nonresponse rates, response distributions, and the effects of questionnaire design features on a variety of open-ended questions from the Quality of Life in a Changing Nebraska (QLCN) survey. Where possible, we also examine the effects of design changes by subgroup (e.g., among respondents expected to be more or less familiar with each mode). The QLCN was conducted between July and September 2009 (N=566; AAPOR RR1 = 46%) and contained eleven open-ended boxes requesting numeric information and two open-ended boxes requesting descriptive text. In addition to being randomly assigned to either the web or mail mode, respondents were randomly assigned to one of two questionnaire design treatments. The questionnaire design experiments include a question-order experiment, small versus large box size on both numeric and text questions, and presence versus absence of answer-box labels on numeric questions.
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 66th Annual Conference, 2011
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Exploring Health-related Experiences and Access to Care: Differences between Online and Telephone Survey...; 2011; Doty, M. M., Peugh, J., Shand-Lubbers, J.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- Effects of Response Formats when Measuring Attitudes in Consumer Web Surveys Across Markets; 2011; Couper, M. P., Nunge, E.
- Re-Examining the Validity of Different Survey Modes for Measuring Public Opinion in the U.S.: Findings...; 2011; Ansolabehere, S., Fraga, B., Schaffner, B. F.
- How to Survey All 14,000 Swedish Local Political Representatives And Get 10,000 Responses; 2011; Gilljam, M., Granberg, D., Holm, B., Persson, M.
- Measuring User Satisfaction in the Lab: Questionnaire Mode, Physical Location, and Social Presence Concerns...; 2011; Jans, M., Romano, J. C., Ashenfelter, K. T., Krosnick, J. A.
- Interactive Interventions in Web Surveys Can Increase Response Accuracy; 2011; Conrad, F. G.
- Impact on Data Quality of Making Incentives Salient in Web Survey Invitations; 2011; Zhang, Che.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- Effects of Differential Incentives on Response Rates in Four Countries for a Web-based Follow Up Survey...; 2011; McSpurren, K.
- Completing Web Surveys on Cell-enabled iPads; 2011; Dayton, J., Driscoll, H.
- The Social Aspect of the Digital Divide; 2011; Johnson, E. P.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Matrix Questionnaire Design to Reduce Measurement Error; 2011; Peytchev, A., Peytcheva, E.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Providing Clarifying Instructions in a Web Survey; 2011; Redline, C. D.